Neural approximations of scalar and vector fields, such as signed distance functions and radiance fields, have emerged as accurate, high-quality representations. State-of-the-art results are obtained by conditioning a neural approximation with a lookup from trainable feature grids, which take on part of the learning task and allow for smaller, more efficient neural networks. Unfortunately, these feature grids usually come at the cost of significantly increased memory consumption compared to stand-alone neural network models. We present a dictionary method for compressing such feature grids, reducing their memory consumption by up to 100x and permitting a multiresolution representation, which can be useful for out-of-core streaming. We formulate the dictionary optimization as a vector-quantized auto-decoder problem, which lets us learn end-to-end discrete neural representations without direct supervision and in spaces with dynamic topology and structure. Our source code will be available at https://github.com/nv-tlabs/vqad.
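The vector-quantized feature-grid idea can be sketched in a few lines: each grid corner stores learnable logits over a shared codebook instead of a full feature vector, so memory scales with the number of index bits rather than the feature dimension. This is a minimal illustration under assumptions, not the paper's implementation; the class and parameter names (VQFeatureGrid, num_corners, codebook_size) are hypothetical, and the soft assignment stands in for the paper's actual auto-decoder training scheme.

```python
# Hypothetical sketch of a vector-quantized feature grid: a shared
# dictionary ("codebook") of feature vectors plus per-corner logits
# that softly select a codebook entry, learned end-to-end.
import torch
import torch.nn as nn
import torch.nn.functional as F

class VQFeatureGrid(nn.Module):
    def __init__(self, num_corners=4096, codebook_size=64, feature_dim=16):
        super().__init__()
        # Shared dictionary of feature vectors (the codebook).
        self.codebook = nn.Parameter(torch.randn(codebook_size, feature_dim))
        # Per-corner logits over codebook entries.
        self.logits = nn.Parameter(torch.zeros(num_corners, codebook_size))

    def forward(self, corner_ids):
        # Soft assignment keeps the lookup differentiable during training;
        # after training, each corner collapses to a single integer index,
        # which is where the compression comes from.
        weights = F.softmax(self.logits[corner_ids], dim=-1)
        return weights @ self.codebook  # (..., feature_dim)

grid = VQFeatureGrid()
features = grid(torch.tensor([0, 17, 123]))  # features for 3 grid corners
print(features.shape)  # torch.Size([3, 16])
```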
Recent advances in machine learning have created increasing interest in solving visual computing problems using a class of coordinate-based neural networks that parametrize the physical properties of scenes or objects across space and time. These methods, which we call neural fields, have seen successful application in the synthesis of 3D shapes and images, the animation of human bodies, 3D reconstruction, and pose estimation. However, due to rapid progress in a short time, many papers exist, but a comprehensive review and formulation of the problem has not yet emerged. In this report, we address this limitation by providing context, mathematical grounding, and an extensive review of the literature on neural fields. This report covers research along two dimensions. In Part I, we focus on neural field techniques by identifying common components of neural field methods, including different representations, architectures, forward mappings, and generalization methods. In Part II, we focus on applications of neural fields to different problems in visual computing and beyond (e.g., robotics, audio). Our review shows the breadth of topics already covered in visual computing, both historically and in current incarnations, and demonstrates the improved quality, flexibility, and capability brought by neural field methods. Finally, we present a companion website that contributes a living version of this review, which can be continually updated by the community.
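As a concrete illustration of the common formulation surveyed in the report, a neural field in its simplest form is an MLP that maps a spatial coordinate to a field value (e.g., a signed distance or a density). This is a generic sketch with illustrative names, not any particular method from the review.

```python
# Minimal coordinate-based neural field: an MLP from (x, y, z) to a
# scalar field value. Hidden size and depth are arbitrary choices here.
import torch
import torch.nn as nn

class NeuralField(nn.Module):
    def __init__(self, in_dim=3, hidden=256, out_dim=1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, coords):
        # coords: (N, 3) points in space; returns one field value per point.
        return self.net(coords)

field = NeuralField()
values = field(torch.rand(8, 3))  # query the field at 8 random points
```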
Neural signed distance functions (SDFs) are emerging as an effective representation for 3D shapes. State-of-the-art methods typically encode the SDF with a large, fixed-size neural network to approximate complex shapes with implicit surfaces. Rendering with these large networks is, however, computationally expensive since it requires many forward passes through the network for every pixel, making these representations impractical for real-time graphics. We introduce an efficient neural representation that, for the first time, enables real-time rendering of high-fidelity neural SDFs, while achieving state-of-the-art geometry reconstruction quality. We represent implicit surfaces using an octree-based feature volume which adaptively fits shapes with multiple discrete levels of detail (LODs), and enables continuous LOD with SDF interpolation. We further develop an efficient algorithm to directly render our novel neural SDF representation in real-time by querying only the necessary LODs with sparse octree traversal. We show that our representation is 2-3 orders of magnitude more efficient in terms of rendering speed compared to previous works. Furthermore, it produces state-of-the-art reconstruction quality for complex shapes under both 3D geometric and 2D image-space metrics.
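The continuous-LOD idea from the abstract can be sketched as a blend between the predictions at two adjacent discrete levels, weighted by the fractional part of the desired level. This is a minimal sketch under assumptions, not the paper's renderer; the function and variable names are illustrative, and the full method additionally traverses a sparse octree and interpolates corner features before a small MLP predicts each SDF value.

```python
# Sketch of continuous level of detail: SDF predictions at adjacent
# discrete LODs are linearly interpolated by a fractional level.
import torch

def continuous_lod_sdf(sdf_per_level, lod):
    """sdf_per_level: (num_levels,) SDF predictions at each discrete LOD;
    lod: desired fractional level in [0, num_levels - 1]."""
    lo = int(lod)
    hi = min(lo + 1, sdf_per_level.shape[0] - 1)
    alpha = lod - lo  # blending weight between the two levels
    return (1.0 - alpha) * sdf_per_level[lo] + alpha * sdf_per_level[hi]

sdf_levels = torch.tensor([0.12, 0.10, 0.09])  # predictions at LODs 0..2
d = continuous_lod_sdf(sdf_levels, lod=1.4)    # blends LOD 1 and LOD 2
```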